The term ''use error'' has recently been introduced to replace the commonly used terms ''human error'' and ''user error''. The new term, which has already been adopted by international standards organizations for medical devices (see "Use errors in health care" below for references), suggests that accidents should be attributed to the circumstances rather than to the human beings who happened to be present.

==The need for the terminological change==

The term "use error" was first used in May 1995 in an MD+DI guest editorial, "The Issue Is 'Use,' Not 'User,' Error," by William Hyman.〔http://www.mddionline.com/article/issue-use-not-user-error〕

Traditionally, human errors are treated as a special aspect of human factors and are accordingly attributed to the human operator, or user. This approach assumes that the system design is perfect and that the only source of use errors is the human operator. For example, the U.S. Department of Defense (DoD) HFACS〔(), Department of Defense Human Factors Analysis and Classification System: A mishap investigation and data analysis tool〕 classifies use errors as attributable to the human operator, disregarding improper design and configuration settings, which often result in missing alarms or in inappropriate alerting.〔(), Weiler and Harel: Managing the Risks of Use Errors: The ITS Warning Systems Case Study〕

The need to change the term arose from a common malpractice among stakeholders (the responsible organizations, the authorities, journalists) in cases of accidents:〔(), Dekker: Reinvention of Human Error〕 instead of investing in fixing the error-prone design, management attributed the error to the users. Accident investigators have pointed out the need for the change:

* As early as 1983, Erik Hollnagel〔(), Erik Hollnagel home page〕 pointed out that the term ''human error'' refers to the outcome, not to the cause.
A user action is typically classified as an error only if the results are painful.〔(), Hollnagel: Why "Human Error" is a Meaningless Concept〕
* In the story "Leap of Faith" in his book "Set Phasers on Stun",〔(), Steve Casey: Set Phasers on Stun〕 Steve Casey suggested that the 1990 accident of Indian Airlines Flight 605 near Bangalore could have been avoided had the investigators of the 1988 Air France Flight 296 accident at the Mulhouse-Habsheim airport considered the circumstances (an exceptional situation) rather than the pilots (human errors).
* In his book "Managing the Risks of Organizational Accidents" (organizational models of accidents), James Reason explained and demonstrated that the circumstances leading to accidents could often have been controlled by the responsible organization, not by the operators.
* In his book "The Field Guide to Understanding Human Error",〔(), Sidney Dekker: The Field Guide to Understanding Human Error〕 Sidney Dekker argued that blaming the operators, in line with "The Old View", results in defensive behavior by operators, which hampers efforts to learn from near misses and from accidents.
* In a recent study,〔(), Harel & Weiss, Mitigating the Risks of Unexpected Events by Systems Engineering〕 Harel and Weiss suggested that the Zeelim accident during an Israeli military exercise in 1992 could have been prevented had the Israeli forces focused on learning from the accident of 1990 rather than on punishing the field officers involved in the exercise.

Source: the free encyclopedia Wikipedia, article "use error".